Revisiting Perceptron: Efficient and Label-Optimal Learning of Halfspaces

Yan, Songbai, Zhang, Chicheng

Neural Information Processing Systems

It has been a long-standing problem to efficiently learn a halfspace using as few labels as possible in the presence of noise. In this work, we propose an efficient Perceptron-based algorithm for actively learning homogeneous halfspaces under the uniform distribution over the unit sphere. Under the adversarial noise condition [ABL14, KLS09, KKMS08], where at most a $\tilde{\Omega}(\epsilon)$ fraction of labels can be flipped, our algorithm achieves a near-optimal label complexity of $\tilde{O}\left(d\ln\frac{1}{\epsilon}\right)$ in time $\tilde{O}\left(\frac{d^2}{\epsilon}\right)$. Furthermore, we show that our active learning algorithm can be converted to an efficient passive learning algorithm that has near-optimal sample complexities with respect to $\epsilon$ and $d$.
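
The algorithm itself is not reproduced on this page; the sketch below only illustrates the general idea behind Perceptron-based active learning in this setting: draw points uniformly from the sphere, spend a label query only when a point falls in a band around the current hypothesis, and apply a normalized Perceptron update on mistakes. This is a minimal illustration, not the paper's procedure; the function names, the fixed band width and step size, and the toy noisy oracle are all assumptions, and the actual algorithm adapts these quantities across epochs.

import numpy as np

def sample_sphere(n, d, rng):
    # Draw n points uniformly at random from the unit sphere in R^d.
    x = rng.normal(size=(n, d))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def active_perceptron(oracle, d, n_rounds=5000, band=0.1, seed=0):
    # Hedged sketch: spend label queries only on points whose margin
    # |w . x| under the current hypothesis w is below `band`; on a
    # mistake, take a Perceptron step and project back onto the sphere.
    # (The paper's algorithm also shrinks the band and the step size
    # over epochs; both are kept fixed here for brevity.)
    rng = np.random.default_rng(seed)
    w = sample_sphere(1, d, rng)[0]       # random unit-vector start
    for _ in range(n_rounds):
        x = sample_sphere(1, d, rng)[0]
        if abs(w @ x) > band:
            continue                      # far from the boundary: skip, no query
        y = oracle(x)                     # one label query spent
        if y * (w @ x) <= 0:              # mistake: Perceptron update
            w = w + y * x
            w /= np.linalg.norm(w)        # keep the hypothesis on the sphere
    return w

# Toy usage: labels come from a hidden halfspace, 5% of them flipped.
rng = np.random.default_rng(1)
w_star = sample_sphere(1, 20, rng)[0]

def noisy_oracle(x):
    y = np.sign(w_star @ x) or 1.0
    return -y if rng.random() < 0.05 else y

w_hat = active_perceptron(noisy_oracle, d=20)
print("angle to w*:", np.arccos(np.clip(w_hat @ w_star, -1.0, 1.0)))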


Reviews: Revisiting Perceptron: Efficient and Label-Optimal Learning of Halfspaces

Neural Information Processing Systems

It shows that their algorithm learns using a nearly tight number of labels under random independent label noise of bounded rate. Previous work had exponentially worse dependence on the noise rate. In addition, it shows that this algorithm can deal with adversarial noise of sufficiently low rate. The latter result improves polynomially on the sample complexity but requires a stronger condition on the noise rate. The assumptions in this setting are very strong and as a result are highly unlikely to hold in any realistic problem.
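
For concreteness, the two noise models the review refers to can be written as follows. This is a reconstruction from the standard definitions; the abstract above only states the adversarial case, and the symbols $\eta$, $\nu$, and $w^{*}$ are our notation, not the review's:

$$\text{bounded (random independent) noise:}\quad \Pr\left[\,y \neq \operatorname{sign}(w^{*}\cdot x)\;\middle|\;x\,\right] \le \eta < \tfrac{1}{2} \quad\text{for every } x,$$

$$\text{adversarial noise:}\quad \Pr_{(x,y)\sim D}\left[\,y \neq \operatorname{sign}(w^{*}\cdot x)\,\right] \le \nu, \qquad \nu = \tilde{O}(\epsilon).$$

The adversarial condition is "stronger" in the sense that the total fraction of corrupted labels must shrink with the target error $\epsilon$, whereas the bounded-rate model allows any constant flip probability below $1/2$.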

